Motivation: Improving Simple Sorting

Consider a basic sorting algorithm like Selection Sort. It repeatedly finds the largest element in the unsorted portion of an array and swaps it into the last position of that portion (a brief sketch follows the list below).

  • Finding the largest element takes $O(n)$ time.
  • This is repeated $n$ times, leading to an overall time complexity of $O(n^2)$.
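
As a concrete reference, here is a minimal Python sketch of this select-the-maximum variant (the function name selection_sort is just illustrative):

```python
def selection_sort(a):
    """Repeatedly select the largest element of the unsorted prefix
    and swap it into the last position of that prefix (minimal sketch)."""
    for end in range(len(a) - 1, 0, -1):
        largest = 0
        for i in range(1, end + 1):              # O(n) scan for the maximum
            if a[i] > a[largest]:
                largest = i
        a[largest], a[end] = a[end], a[largest]  # move it to the end of the unsorted part

data = [4, 10, 3, 5, 1]
selection_sort(data)
print(data)  # [1, 3, 4, 5, 10] -- n-1 passes, each an O(n) scan: O(n^2) overall
```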

The bottleneck is repeatedly finding the maximum. What if we could find the max faster? This is where heaps come in. By organizing the array into a max-heap, we can get the largest element in $O(1)$ and maintain the heap property in $O(\log n)$ time.
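
As a sketch of that idea: with the heap stored in a plain array, the maximum is simply the first element, and a single "sift down" pass along one root-to-leaf path repairs the heap after the root changes. The helper name sift_down below is illustrative (the operation is often called max-heapify):

```python
def sift_down(heap, root, size):
    """Restore the max-heap property below `root`, treating heap[:size] as the heap."""
    while True:
        largest = root
        left, right = 2 * root + 1, 2 * root + 2
        if left < size and heap[left] > heap[largest]:
            largest = left
        if right < size and heap[right] > heap[largest]:
            largest = right
        if largest == root:
            return                                 # heap property holds again
        heap[root], heap[largest] = heap[largest], heap[root]
        root = largest                             # follow the displaced element down one level

heap = [10, 5, 3, 4, 1]          # a max-heap stored as an array
print(heap[0])                   # 10 -- the maximum is always at the root: O(1)
heap[0] = 2                      # disturb the root...
sift_down(heap, 0, len(heap))    # ...one O(log n) pass restores the heap
print(heap)                      # [5, 4, 3, 2, 1]
```

The loop visits at most one node per level of the tree, which is where the $O(\log n)$ bound comes from.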

The Heap Sort Approach

  • Algorithm Concept:
    1. Transform the unsorted array into a max-heap
    2. The largest element is at the root of the heap
    3. Swap it with the last element of the heap
    4. Reduce heap size and restore heap property
    5. Repeat until the array is sorted
  • Algorithm Steps:
    1. Build Max-Heap: Start from the last non-leaf node ($O(n)$)
    2. Repeatedly Extract Maximum:
      • Swap root element with last heap element
      • Decrease heap size by 1
      • Perform heapify on new root to maintain heap property
      • Repeat $n-1$ times ($O(n \log n)$ total)
  • Sorted elements accumulate at the end in ascending order
  • Overall Time Complexity: $O(n \log n)$ - an efficient comparison-based sort (see the sketch after this list)
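
Putting the steps together, here is a minimal in-place Python sketch, assuming the usual 0-based array layout where the children of node $i$ sit at $2i + 1$ and $2i + 2$ (the names heap_sort and sift_down are illustrative):

```python
def heap_sort(a):
    """In-place heap sort: build a max-heap, then repeatedly move the maximum to the end."""
    n = len(a)

    def sift_down(root, size):
        # Restore the max-heap property below `root`, treating a[:size] as the heap.
        while True:
            largest = root
            left, right = 2 * root + 1, 2 * root + 2
            if left < size and a[left] > a[largest]:
                largest = left
            if right < size and a[right] > a[largest]:
                largest = right
            if largest == root:
                return
            a[root], a[largest] = a[largest], a[root]
            root = largest

    # Step 1: build the max-heap, starting from the last non-leaf node -- O(n)
    for i in range(n // 2 - 1, -1, -1):
        sift_down(i, n)

    # Step 2: repeatedly extract the maximum -- (n - 1) sift-downs of O(log n) each
    for end in range(n - 1, 0, -1):
        a[0], a[end] = a[end], a[0]   # move the current maximum into its final position
        sift_down(0, end)             # heapify the reduced heap a[:end]

data = [4, 10, 3, 5, 1]               # the example array from the figure at the end of this section
heap_sort(data)
print(data)                           # [1, 3, 4, 5, 10]
```

The two loops mirror the two phases above: the build loop visits each non-leaf node once, and the extraction loop performs $n - 1$ sift-downs, so the sorted region grows from the back of the array while the heap shrinks.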

Time Complexity Analysis

Case               Complexity       Description
Worst Case         $O(n \log n)$    Guaranteed performance for any input
Average Case       $O(n \log n)$    Typical performance
Best Case          $O(n \log n)$    Even for already-sorted input (the heap must still be built)
Space Complexity   $O(1)$           In-place; only constant auxiliary space

\[ \text{Heap Sort Complexity} = O(n) + O(n \log n) = O(n \log n) \]

Algorithm Properties

  • In-place: Requires only constant extra space
  • Unstable: Relative order of equal elements may change (see the example after this list)
  • Comparison-based: Relies on element comparisons
  • Ideal for: Scenarios requiring guaranteed $O(n \log n)$ performance
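
To see the instability concretely, here is a small sketch that sorts hypothetical (key, label) records by the numeric key only, using the same heap-sort scheme (heap_sort_by is an illustrative name); records with equal keys can come out in a different relative order:

```python
def heap_sort_by(a, key):
    """Heap sort that compares elements only through key(x) (minimal sketch)."""
    n = len(a)

    def sift_down(root, size):
        while True:
            largest, left, right = root, 2 * root + 1, 2 * root + 2
            if left < size and key(a[left]) > key(a[largest]):
                largest = left
            if right < size and key(a[right]) > key(a[largest]):
                largest = right
            if largest == root:
                return
            a[root], a[largest] = a[largest], a[root]
            root = largest

    for i in range(n // 2 - 1, -1, -1):   # build max-heap
        sift_down(i, n)
    for end in range(n - 1, 0, -1):       # repeatedly extract the maximum
        a[0], a[end] = a[end], a[0]
        sift_down(0, end)

records = [(2, "a"), (2, "b"), (1, "c")]   # (2, "a") initially precedes (2, "b")
heap_sort_by(records, key=lambda r: r[0])  # compare on the numeric key only
print(records)                             # [(1, 'c'), (2, 'b'), (2, 'a')] -- equal keys reordered
```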

[Figure: heap sort in progress on the initial unsorted array [4, 10, 3, 5, 1]; the max-heap region at the front of the array shrinks while the sorted region at the end grows.]